16. Max-Min Problems

a. Local Minima, Local Maxima and Saddle Points

4. Second Derivative Test with More Variables (Optional)

The second derivative test generalizes to functions with more than two variables. First notice that the discriminant, \(D\), is the determinant of the \(2\times2\) Hessian matrix: \[ D=\begin{vmatrix} f_{xx} & f_{xy} \\ f_{yx} & f_{yy} \end{vmatrix} =f_{xx}f_{yy}-f_{xy}^2 \] For each additional variable, the Hessian determinant gains one more row and column:
  \(D_1=f_{xx}\)   \(D_2=\begin{vmatrix} f_{xx} & f_{xy} \\ f_{yx} & f_{yy} \\ \end{vmatrix}\)   \(D_3=\begin{vmatrix} f_{xx} & f_{xy} & f_{xz} \\ f_{yx} & f_{yy} & f_{yz} \\ f_{zx} & f_{zy} & f_{zz} \\ \end{vmatrix}\)   \(D_4=\begin{vmatrix} f_{xx} & f_{xy} & f_{xz} & f_{xu} \\ f_{yx} & f_{yy} & f_{yz} & f_{yu} \\ f_{zx} & f_{zy} & f_{zz} & f_{zu} \\ f_{ux} & f_{uy} & f_{uz} & f_{uu} \\ \end{vmatrix}\)   \(D_5=\begin{vmatrix} f_{xx} & f_{xy} & f_{xz} & f_{xu} & f_{xv} \\ f_{yx} & f_{yy} & f_{yz} & f_{yu} & f_{yv} \\ f_{zx} & f_{zy} & f_{zz} & f_{zu} & f_{zv} \\ f_{ux} & f_{uy} & f_{uz} & f_{uu} & f_{uv} \\ f_{vx} & f_{vy} & f_{vz} & f_{vu} & f_{vv} \\ \end{vmatrix}\)   \(\cdots\)   \(D_n=\begin{vmatrix} f_{xx} & f_{xy} & f_{xz} & f_{xu} & \cdots & f_{xw} \\ f_{yx} & f_{yy} & f_{yz} & f_{yu} & \cdots & f_{yw} \\ f_{zx} & f_{zy} & f_{zz} & f_{zu} & \cdots & f_{zw} \\ f_{ux} & f_{uy} & f_{uz} & f_{uu} & \cdots & f_{uw} \\ \vdots & \vdots & \vdots & \vdots & \ddots & \vdots \\ f_{wx} & f_{wy} & f_{wz} & f_{wu} & \cdots & f_{ww} \\ \end{vmatrix}\)
These determinants are called the leading principal minor determinants (LPMD's) of the Hessian matrix. The Second Derivative Test is stated in terms of the LPMD's.

Consider a function of \(n\) variables \(f(\vec x)\) and assume \(\vec c\) is a critical point of \(f\), i.e. \[ \vec\nabla f(\vec c)=\vec0 \] Assume the LPMD's have been evaluated at \(\vec c\).

  1. If all the LPMD's are positive, i.e. \(D_1 > 0, D_2 > 0, D_3 > 0, \cdots, D_n > 0\), then the critical point \(\vec c\) is a local minimum.
  2. If the odd LPMD's are all negative and the even LPMD's are all positive, i.e. \(D_1 < 0, D_3 < 0, D_5 < 0,\cdots\) and \(D_2 > 0, D_4 > 0, D_6 > 0,\cdots\), then the critical point \(\vec c\) is a local maximum.
  3. If \(D_n \ne 0\) but cases (1) and (2) do not hold, then the critical point \(\vec c\) is a saddle point.
  4. Otherwise (i.e. when \(D_n=0\)), the TEST FAILS.
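
The test above is essentially an algorithm: evaluate the LPMD's in order and compare their signs. Here is a minimal sketch of that algorithm in Python (not part of the text); the function name classify_critical_point and the tolerance tol used to decide whether \(D_n=0\) are illustrative choices.

import numpy as np

def classify_critical_point(H, tol=1e-12):
    """Classify a critical point from its Hessian H, an n x n array of the
    second partial derivatives already evaluated at the critical point."""
    n = H.shape[0]
    # D[k-1] is the k-th leading principal minor determinant: the
    # determinant of the upper-left k x k block of the Hessian.
    D = [np.linalg.det(H[:k, :k]) for k in range(1, n + 1)]
    if abs(D[-1]) < tol:
        return "test fails"        # case 4: D_n = 0
    if all(d > 0 for d in D):
        return "local minimum"     # case 1: every D_k > 0
    if all(d < 0 if k % 2 == 1 else d > 0 for k, d in enumerate(D, start=1)):
        return "local maximum"     # case 2: odd D_k < 0, even D_k > 0
    return "saddle point"          # case 3: D_n != 0, neither sign pattern

For instance, for the Hessian found in the example at the bottom of this page, classify_critical_point(np.array([[6., -2., -2.], [-2., 2., 0.], [-2., 0., 2.]])) returns "local minimum".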

Justification

To justify the second derivative test, we will only look at the cases where all the mixed second partial derivatives are \(0\). In that case, the Hessian is diagonal and the \(k^\text{th}\) LPMD is the product of the first \(k\) direct second partial derivatives, e.g. \(D_4=f_{xx}f_{yy}f_{zz}f_{uu}\). In particular, \(D_n\) is the product of all \(n\) direct second partial derivatives. In the first three cases, \(D_n \ne 0\). So all the direct second partial derivatives are nonzero.
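
In this diagonal case, each direct second partial derivative is a ratio of consecutive LPMD's: \[ f_{xx}=D_1, \qquad f_{yy}=\dfrac{D_2}{D_1}, \qquad f_{zz}=\dfrac{D_3}{D_2}, \qquad \cdots \] So the sign pattern of the LPMD's determines the signs of all the direct second partial derivatives, which is what the three cases below use.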

If all the LPMD's are positive, i.e. \(D_1 > 0, D_2 > 0, D_3 > 0, \cdots, D_n > 0\), then all the direct second partial derivatives are positive. So the function \(f\) is concave up along all the coordinate axis directions and we expect that \(\vec c\) is a local minimum. A more detailed proof would show that \(f\) is concave up in all directions.

If the odd LPMD's are all negative and the even LPMD's are all positive, i.e. \(D_1 < 0, D_3 < 0, D_5 < 0, \cdots\) and \(D_2 > 0, D_4 > 0, D_6 > 0, \cdots\), then all the direct second partial derivatives are negative. So the function \(f\) is concave down along all the coordinate axis directions and we expect that \(\vec c\) is a local maximum. A more detailed proof would show that \(f\) is concave down in all directions.

If \(D_n \ne 0\) but cases (1) and (2) do not hold, then the function \(f\) is concave up along some coordinate direction and concave down along some other coordinate direction, and so \(\vec c\) is a saddle point. A more detailed proof would show that \(f\) is concave up in some direction and concave down in some other direction.
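
For example, if \(f(x,y,z)=x^2+y^2-z^2\), then the origin is a critical point, the Hessian is diagonal with \(f_{xx}=2\), \(f_{yy}=2\) and \(f_{zz}=-2\), and the LPMD's are \[ D_1=2>0, \qquad D_2=4>0, \qquad D_3=-8<0 \] Neither sign pattern holds and \(D_3\ne0\), so the origin is a saddle point: \(f\) is concave up along the \(x\) and \(y\) axes but concave down along the \(z\) axis.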

In this class, you will most likely never be asked to apply the Second Derivative Test to any function with more than two variables. So this page was included essentially for conceptual and theoretical reasons. However, here is an example anyway:

Find all critical points of \(f(x,y,z)=3x^2-2xy-2xz-4x+y^2+z^2\) and use the second derivative test to classify each as a local maximum, a local minimum or a saddle point, or say the test fails.

We first compute the partial derivatives and set them equal to zero. \[\begin{aligned} f_x&=6x-2y-2z-4=0 \qquad \text{(1)} \\ f_y&=-2x+2y=0 \qquad \qquad \quad \text{(2)} \\ f_z&=-2x+2z=0 \qquad \qquad \quad \text{(3)} \end{aligned}\] Equations (2) and (3) say \(y=x\) and \(z=x\). So equation (1) becomes \(2x-4=0\). Consequently, the critical point is \((x,y,z)=(2,2,2)\). The Hessian is \[ \text{Hess} f =\begin{pmatrix} f_{xx} & f_{xy} & f_{xz} \\ f_{yx} & f_{yy} & f_{yz} \\ f_{zx} & f_{zy} & f_{zz} \end{pmatrix} =\begin{pmatrix} 6 & -2 & -2 \\ -2 & 2 & 0 \\ -2 & 0 & 2 \end{pmatrix} \] So the leading principal minor determinants are \[\begin{aligned} D_1&=f_{xx}=6 \\ D_2 &=\begin{vmatrix} f_{xx} & f_{xy}\\ f_{yx} & f_{yy} \end{vmatrix} =\begin{vmatrix} 6 & -2 \\ -2 & 2 \end{vmatrix} =8 \\ D_3 &=\begin{vmatrix} f_{xx} & f_{xy} & f_{xz} \\ f_{yx} & f_{yy} & f_{yz} \\ f_{zx} & f_{zy} & f_{zz} \end{vmatrix} =\begin{vmatrix} 6 & -2 & -2 \\ -2 & 2 & 0 \\ -2 & 0 & 2 \end{vmatrix} =8 \end{aligned}\] Since these are all positive, the critical point, \((x,y,z)=(2,2,2)\), is a local minimum.
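
The computation can also be checked symbolically. Here is a minimal sketch in Python using SymPy (again, not part of the original solution); since \(f\) is quadratic, the Hessian is constant, so it does not need to be evaluated at the critical point.

import sympy as sp

x, y, z = sp.symbols('x y z')
f = 3*x**2 - 2*x*y - 2*x*z - 4*x + y**2 + z**2

# Critical points: solve grad f = 0.
grad = [sp.diff(f, v) for v in (x, y, z)]
crit = sp.solve(grad, (x, y, z), dict=True)    # [{x: 2, y: 2, z: 2}]

# Hessian and its leading principal minor determinants D_1, D_2, D_3.
H = sp.hessian(f, (x, y, z))
D = [H[:k, :k].det() for k in range(1, 4)]     # [6, 8, 8]
print(crit, D)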
